Google’s TPU Expansion Rattles Nvidia as AI Chip Wars Intensify
Google's Gemini 3 AI model, powered predominantly by its proprietary Tensor Processing Units (TPUs), has forced OpenAI into a strategic pivot. The launch reportedly triggered a "code red" at Sam Altman's firm, with resources now being funneled into stabilizing ChatGPT's core models. The shift underscores the growing advantage of vertically integrated hardware in the AI arms race.
The search giant is aggressively scaling its TPU operations, with plans to more than double production by 2028. A landmark deal to supply 1 million chips to Anthropic—valued in the tens of billions of dollars—has sent shockwaves through the semiconductor sector. Analysts at SemiAnalysis now rate TPUs as competitive with Nvidia's offerings for both training and inference workloads.
Morgan Stanley estimates that every 500,000 TPUs sold externally could generate roughly $13 billion in revenue for Google. The bank projects TSMC will manufacture 3.2 million units in 2025, scaling to 7 million by 2028. This vertical integration strategy, developed with Broadcom and MediaTek, threatens to erode Nvidia's data center dominance as Google begins marketing TPUs beyond its cloud ecosystem.